
PRaNA: PRNU-based Technique to Tell Real and Deepfake Videos Apart

Amerini I.; Conti M.; Giacomazzi P.; Pajola L.
2022

Abstract

Videos are a powerful means of communication adopted in several contexts and used for both benign and malicious purposes (e.g., education vs. reputation damage). Nowadays, realistic video manipulation strategies such as deepfake generators constitute a severe threat to our society in terms of misinformation. While most current research focuses on deepfake detection as a binary task, the identification of a real video among a pool of deepfakes sharing the same origin is not widely investigated. Although the pool task might be rarer in real life than the binary one, the outcomes derived from these analyses might help us better understand deepfake behaviours, benefiting binary deepfake detection as well. In this paper, we address this less investigated scenario by examining the role of Photo Response Non-Uniformity (PRNU) in deepfake detection. Our analysis, in agreement with prior studies, shows that PRNU can be a valuable source to identify deepfake videos. In particular, we found that unique PRNU characteristics exist to distinguish real videos from their deepfake versions: real video autocorrelations tend to be lower than those of their deepfake versions. Motivated by this, we propose PRaNA, a training-free strategy that leverages PRNU autocorrelation. Our results on three well-known datasets confirm our algorithm's robustness and transferability, with accuracy up to 66% when considering one real video in a pool of four deepfakes using the real video as a source, and up to 80% when only one deepfake is considered. Our work aims to open different strategies to counter deepfake diffusion.
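
The abstract only outlines the mechanism, so the following Python snippet is a minimal illustrative sketch, not the authors' implementation: the PRNU residual is approximated by subtracting a Gaussian-denoised frame (the actual extraction filter used in the paper is not given here), the autocorrelation statistic is a hypothetical mean off-peak value, and the decision rule simply labels the lowest-scoring video as real, following the reported observation that real videos show lower PRNU autocorrelation than their deepfake versions.

# Illustrative sketch only: PRNU extraction and the autocorrelation statistic
# are assumptions standing in for the procedure described in the paper.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def prnu_residual(frames, sigma=1.0):
    """Average noise residual over grayscale frames (hypothetical PRNU estimate)."""
    residuals = [f.astype(np.float64) - gaussian_filter(f.astype(np.float64), sigma)
                 for f in frames]
    return np.mean(residuals, axis=0)

def autocorrelation_score(residual):
    """Mean absolute off-peak autocorrelation of the residual (assumed statistic)."""
    r = residual - residual.mean()
    ac = fftconvolve(r, r[::-1, ::-1], mode="same")   # 2D autocorrelation
    ac /= ac.max()                                     # normalize the zero-lag peak to 1
    peak = np.unravel_index(np.argmax(ac), ac.shape)
    ac[peak] = 0.0                                     # discard the trivial zero-lag peak
    return np.mean(np.abs(ac))

def pick_real(videos):
    """Given {name: list_of_grayscale_frames}, label the lowest-scoring video as real,
    since real videos tend to show lower PRNU autocorrelation than their deepfakes."""
    scores = {name: autocorrelation_score(prnu_residual(frames))
              for name, frames in videos.items()}
    return min(scores, key=scores.get), scores

The ranking-based decision rule reflects the pool scenario from the abstract (one real video among several deepfakes sharing the same origin); in the one-real-vs-one-deepfake case it reduces to picking the video with the lower score.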
2022
2022 International Joint Conference on Neural Networks, IJCNN 2022
Deepfake detection; photo response non-uniformity; video forensics
04 Conference proceedings publication::04b Conference paper in volume
PRaNA: PRNU-based Technique to Tell Real and Deepfake Videos Apart / Amerini, I.; Conti, M.; Giacomazzi, P.; Pajola, L.. - 2022-:(2022), pp. 1-7. (Paper presented at the 2022 International Joint Conference on Neural Networks, IJCNN 2022, held in Italy) [10.1109/IJCNN55064.2022.9892413].
Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1662841